36 results for Lot sizing and scheduling

in Deakin Research Online - Australia


Abstract:

A discrete event model was used to examine the effect of machine downtime and operating policy on the long-run average cost of an automotive stamping line. Operating policy refers to the selection of a target batch size and the circumstances under which a line stoppage will lead to the current batch being abandoned. It is assumed that the abandon/resume decision is based solely on the severity of the problem (i.e., repair cost) and the fraction of the batch completed. A method of identifying low-cost operating policies is presented using data obtained from a real stamping plant. It is found that, within a single part framework, this approach results in significantly lower average costs than are currently achieved. It is also demonstrated that by varying the model parameters it is possible to measure the potential benefits arising from process modifications (e.g., decreased die-set times). This can be used to identify the areas where improvements will have the greatest impact on cost and is particularly useful when assessing the expected return on a potential investment. A multiple-part extension to the model is suggested and the potential benefits discussed.
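
For illustration only, the sketch below (Python) encodes an abandon/resume rule of the kind described above, driven solely by the estimated repair cost and the fraction of the batch completed. The threshold shape and all parameter values are assumptions for the sake of the example, not values from the study.

def should_abandon(repair_cost, fraction_complete,
                   cost_threshold=5000.0, completion_threshold=0.8):
    """Decide whether to abandon the current batch after a line stoppage.

    Illustrative rule only: abandon when the repair is expensive relative
    to how much of the batch still remains to be completed.
    """
    if fraction_complete >= completion_threshold:
        return False  # batch nearly finished: always resume
    return repair_cost >= cost_threshold * (1.0 - fraction_complete)

# Example: a 4,000-dollar repair with 30% of the batch complete
print(should_abandon(4000.0, 0.3))  # True under these assumed thresholds

A policy search of the kind the paper describes would then evaluate candidate combinations of target batch size and abandonment threshold against the simulated long-run average cost.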

Abstract:

Construction Planning and Scheduling is taught for the first time in Semester 2, 2004 in the School of Architecture and Building, Deakin University. During the unit development process and the implementation of teaching activities, several issues arose in relation to implementing computer-aided construction scheduling and unit delivery in a unitary environment. Although various types of construction planning and scheduling software have been developed and applied, none of them can run inside an online teaching software package, which provides powerful administration functions. This research aims to explore strategies for connecting a project planning and scheduling software package with an online teaching and learning software package through a Web-based support platform, so that both the lecturer and students can draw up and communicate a construction plan or schedule with tables and figures. The key techniques of this supportive platform are identified; they include a web-based, graphically user-interfaced, dynamic and distributed multimedia data acquisition mechanism, which accepts users' drawings and retrieval requests from a canvas and stores the multimedia data on a server for further use. This paper demonstrates the techniques and principles needed to construct such a multimedia data acquisition tool. This research fills a gap in the literature with respect to an online pedagogical solution to an existing problem.

Abstract:

At the heart of this study is my interest in the way in which a religious community establishes its sense of identity and its boundaries in relation to other groups. I explore the case of Israel's attitude towards her eastern neighbours, the Moabites and Ammonites, as portrayed in Tanakh, the Hebrew Bible. Most commentary from the last one hundred years privileges one particular view of Moab and Ammon as traditional enemies of Israel. I aim to show the validity of readings of the biblical accounts that reveal a more complex relationship between Israel and her neighbours. Tanakh exhibits a dialectic between eirenic and hostile viewpoints. The stories of Abraham and Lot, who are presented as ancestors of Israel and of Moab and Ammon, to some degree represent Israel’s understanding of her neighbours. Conventional commentaries take for granted the accepted orthodoxy of Judaism, Christianity and Islam concerning Abraham and his significance in terms of faith and righteousness and blessing and covenant. As none of these notions is specifically linked to Lot at any point, he is treated as a pathetic figure and remains secondary in conventional commentary. Many commentaries denigrate the character of Lot, often in direct comparisons with Abraham. My reading of the texts of Genesis attempts to free the story of Lot from the constraints imposed by the way the story of Abraham functions. A careful reading of the Genesis account shows that Lot and Abraham exhibit similar elements of moral ambiguity, and Genesis contains no statement that condemns Lot on moral or religious grounds. Genesis 19, the single narrative in which Lot appears independently of Abraham, participates in the dialectic elsewhere in Tanakh. On the basis of a consistent pattern of action and speech throughout the first portion of Genesis 19, I advance my own original conception of the eirenic viewpoint of the narrator concerning Lot and his relationship to the divine. I attempt to demonstrate ways in which the story of Lot critiques or deconstructs the dominant ideology centred upon Abraham. My conception of the particular interests of the compiler of Genesis 19 is supported by several intertextual studies. These include the traditions of Sodom and of Zoar, the story of hospitality in Judges 19, the story of the deluge (Genesis 6-9) and stories of women who, like Lot’s daughters, act to continue the family line. In a treatment of the history of Lot traditions, I find evidence to separate the story of Lot from the work of the Yahwist. I consider whether the stories of Lot have a derivation east of the Jordan and whether the stories were of particular interest to the Deuteronomists. In the final chapter of this study, I focus on the main themes of the narratives concerning Lot and Abraham, and Moab and Ammon and Israel. The question of social boundaries arises in regard to many of these themes, such as the interaction of female and male, the role of wealth, the relation of city and country, kinship, and rights to land settlement. In this way, the treatment of Lot and Abraham in Tanakh and in subsequent traditions offers a perspective upon the formation of identity in the contemporary world of religious plurality.

Abstract:

This paper describes a multi-level system dynamics (SD) / discrete event simulation (DES) approach for assessing planning and scheduling problems within an aviation training continuum. The aviation training continuum is a complex system, consisting of multiple aviation schools interacting through interschool student and instructor flows that are affected by external triggers such as resource availability and the weather.
SD was used to model the overall training continuum at a macro level to ascertain relationships between system entities. SD also assisted in developing a shared understanding of the training continuum, which involves constructing the definitions of the training requirements, resources and policy objectives. An end-to-end model of the continuum is easy to relate to, while dynamic visualisation of system behaviour provides a method for exploration of the model.
DES was used for micro level exploration of an individual school within the training continuum to capture the physical aspects of the system including resource capacity requirements, bottlenecks and student waiting times. It was also used to model stochastic events such as weather and student availability. DES has the advantage of being able to represent system variability and accurately reflect the limitations imposed on a system by resource constraints.
Through sharing results between the models, we demonstrate a multi-level approach to the analysis of the overall continuum. The SD model provides the school’s targeted demand to the DES model. The detailed DES model is able to assess schedules in the presence of resource constraints and variability and provide the expected capacity of a school to the high-level SD model, subject to constraints such as instructor availability or the budgeted number of training systems. The SD model allows stakeholders to assess how policy and planning affect the continuum, both in the short and the long term.
The development of this approach permits moving the analysis of the continuum between the SD and DES models as appropriate for given system entities, scales and tasks. The resultant model outcomes are propagated between the continuum-level SD model and the detailed DES model, iteratively generating an assessment of the entire set of plans and schedules across the continuum. Combining data and information between SD and DES models and techniques ensures relevance to stakeholder needs and effective problem scoping and scaling that can also evolve with dynamic architecture and policy requirements.
An example case study shows the combined use of the two models and how they are used to evaluate a typical scenario in which increased demand is placed on the training continuum. The multi-level approach provides a high-level indication of training requirements to the model of the new training school, where the detailed model indicates the resources required to achieve those particular student levels.
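
As a rough sketch of the iterative exchange described above, the Python fragment below stands in for the two models with placeholder functions; the numbers, function names and convergence rule are assumptions for illustration, not part of the study.

def run_sd_model(assumed_capacity):
    """Placeholder for the macro-level SD model: returns the demand
    (students per period) it targets for the school, given the capacity
    it currently believes the school can deliver."""
    return min(120, int(0.9 * assumed_capacity) + 20)

def run_des_model(target_demand):
    """Placeholder for the detailed DES model: returns the capacity the
    school can actually achieve for that demand under resource
    constraints (instructors, training systems, weather)."""
    return min(target_demand, 100)

capacity = 150  # initial assumed school capacity
for _ in range(20):  # iterate until the two models agree
    demand = run_sd_model(capacity)     # SD: policy-level target demand
    achievable = run_des_model(demand)  # DES: capacity under constraints
    if achievable == capacity:
        break
    capacity = achievable
print(demand, capacity)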

Abstract:

Scientific workflow offers a framework for cooperation between remote and shared resources in a grid computing environment (GCE) for scientific discovery. One major function of scientific workflow is to schedule a collection of computational subtasks in well-defined orders for efficient outputs by estimating task duration at runtime. In this paper, we propose a novel time computation model based on algorithm complexity (termed the TCMAC model) for high-level data-intensive scientific workflow design. The proposed model schedules the subtasks based on their durations and the complexities of the participant algorithms. Characterized by its use of a task duration computation function for time efficiency, the TCMAC model has three features for a full-aspect scientific workflow including both dataflow and control-flow: (1) it provides flexible and reusable task duration functions in a GCE; (2) it facilitates better parallelism in iteration structures by providing more precise task durations; and (3) it accommodates dynamic task durations for rescheduling in selective structures of control flow. We also present theories and examples in scientific workflows to show the efficiency of the TCMAC model, especially for control-flow. Copyright © 2009 John Wiley & Sons, Ltd.
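
As an illustration of scheduling by complexity-derived durations, the sketch below (Python) estimates subtask durations from assumed complexity functions and dispatches the shortest first; the complexity classes, constants and dispatch rule are invented for the example and are not the TCMAC model itself.

import math

# Illustrative duration functions: estimated runtime versus input size n,
# derived from each participant algorithm's complexity class. The constants
# are assumed calibration factors, not values from the paper.
DURATION_FUNCS = {
    "sort":      lambda n: 2e-7 * n * math.log2(max(n, 2)),  # O(n log n)
    "pairwise":  lambda n: 5e-8 * n * n,                     # O(n^2)
    "aggregate": lambda n: 1e-6 * n,                         # O(n)
}

def estimate_duration(task_name, input_size):
    return DURATION_FUNCS[task_name](input_size)

def schedule_by_duration(ready_tasks):
    """Order ready subtasks by estimated duration (shortest first), one
    simple way a duration-aware workflow scheduler might dispatch."""
    return sorted(ready_tasks, key=lambda t: estimate_duration(*t))

ready = [("pairwise", 10_000), ("sort", 1_000_000), ("aggregate", 5_000_000)]
for name, n in schedule_by_duration(ready):
    print(name, round(estimate_duration(name, n), 3))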

Abstract:

Coal handling is a complex process involving different correlated and highly dependent operations, such as selecting appropriate product types, planning stockpiles, scheduling stacking and reclaiming activities, and managing train loads. Planning these operations manually is time consuming and can result in non-optimized schedules, as the future impact of decisions may not be appropriately considered. This paper addresses the operational scheduling of the continuous coal handling problem with multiple conflicting objectives. As the problem is NP-hard in nature, an effective heuristic is presented for planning stockpiles and scheduling resources to minimize delays in production and the age of coal in the stockyard. A model of stockyard operations within a coal mine is described and the problem is formulated as a Bi-Objective Optimization Problem (BOOP). The efficacy of the algorithm is demonstrated on different real-life data scenarios. Computational results show that the solution algorithm is effective and that coal throughput is substantially impacted by the conflicting objectives. Together, the model and the proposed heuristic can act as a decision support system for the stockyard planner to explore the effects of alternative decisions, such as balancing the age and volume of stockpiles and minimizing conflicts due to stacker and reclaimer movements.
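
To make the bi-objective trade-off concrete, the fragment below (Python) shows one greedy step that weighs expected production delay against coal age when choosing the next stockpile to reclaim; the weights, field names and scoring rule are assumptions for illustration, not the heuristic proposed in the paper.

def next_stockpile(stockpiles, w_delay=0.6, w_age=0.4):
    """Pick the next stockpile to reclaim: lower expected delay is better,
    older coal is better to clear first (assumed weighting)."""
    def score(p):
        return w_delay * p["expected_delay_h"] - w_age * p["age_days"]
    return min(stockpiles, key=score)

stockpiles = [
    {"id": "SP1", "expected_delay_h": 2.0, "age_days": 10},
    {"id": "SP2", "expected_delay_h": 0.5, "age_days": 3},
    {"id": "SP3", "expected_delay_h": 1.0, "age_days": 12},
]
print(next_stockpile(stockpiles)["id"])  # SP3 under these assumed weights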

Abstract:

The growing computational power requirements of grand challenge applications have promoted the need for merging high throughput computing and grid computing principles to harness computational resources distributed across multiple organisations. This paper identifies the issues in resource management and scheduling in the emerging high throughput grid computing context. We also survey and study the performance of several space-sharing and time-sharing opportunistic scheduling policies that have been developed for high throughput computing.

Abstract:

Over the years many researchers have investigated the area of MRP production planning, and it remains an area of high interest today. This paper looks at production planning where demand is unpredictable because of the type of product and the market a company produces for. Production planning becomes difficult when demand fluctuates unpredictably, and hence a historical sales forecast is used as the initial data for production planning. Sales from previous years, especially in a seasonal market, do not necessarily correlate well with current sales or with sales for the next year. A planner working in such an environment finds it frustrating to create a feasible production plan that not only meets customers' demands but also builds up the 'correct' amount of stock for the peak selling season. To overcome some of these problems, this work describes a production planning methodology that can be implemented robustly and quickly. The paper studies two multi-item lot-sizing problems. We detail the mathematical development of the planning problem and highlight some solutions to the initial problems investigated.
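
For reference, a standard capacitated multi-item lot-sizing formulation of the kind such planning problems build on can be written as below; the notation is generic (x_it is production of item i in period t, y_it a setup indicator, I_it inventory, d_it demand, s_i and h_i setup and holding costs, a_i and b_i capacity usage per unit and per setup, C_t period capacity, M a large constant) and is not claimed to match the paper's exact model.

\begin{align*}
\min\ & \sum_{i=1}^{N} \sum_{t=1}^{T} \left( s_i\, y_{it} + h_i\, I_{it} \right) \\
\text{s.t.}\ & I_{i,t-1} + x_{it} - I_{it} = d_{it} && \forall i, t \\
& \sum_{i=1}^{N} \left( a_i\, x_{it} + b_i\, y_{it} \right) \le C_t && \forall t \\
& x_{it} \le M\, y_{it}, \qquad x_{it},\, I_{it} \ge 0, \qquad y_{it} \in \{0,1\} && \forall i, t
\end{align*}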

Abstract:

In enterprise grid computing environments, users have access to multiple resources that may be distributed geographically. Thus, resource allocation and scheduling is a fundamental issue in achieving high performance in enterprise grid computing. Most current job scheduling systems for enterprise grid computing provide batch queuing support and focus solely on the allocation of processors to jobs. However, since I/O is also a critical resource for many jobs, the allocation of processor and I/O resources must be coordinated to allow the system to operate most effectively. To this end, we present a hierarchical scheduling policy paying special attention to the I/O and service demands of parallel jobs in homogeneous and heterogeneous systems with background workload. The performance of the proposed scheduling policy is studied under various system and workload parameters through simulation. We also compare the performance of the proposed policy with a static space–time sharing policy. The results show that the proposed policy performs substantially better than the static space–time sharing policy.
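
As a minimal sketch of the coordination idea (not the hierarchical policy itself), the Python fragment below admits a job only when both its processor and I/O demands fit the node's remaining capacity; the class and field names are assumptions for illustration.

class Node:
    def __init__(self, cpus, io_bandwidth):
        self.free_cpus = cpus
        self.free_io = io_bandwidth  # e.g. MB/s of disk bandwidth

    def try_admit(self, job):
        """Dispatch only if both CPU and I/O demands fit; otherwise defer."""
        if job["cpus"] <= self.free_cpus and job["io"] <= self.free_io:
            self.free_cpus -= job["cpus"]
            self.free_io -= job["io"]
            return True
        return False  # one resource would saturate, so the job waits

node = Node(cpus=16, io_bandwidth=400)
jobs = [{"cpus": 8, "io": 300}, {"cpus": 4, "io": 200}, {"cpus": 2, "io": 50}]
print([node.try_admit(j) for j in jobs])  # [True, False, True]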

Abstract:

The advent of commodity-based high-performance clusters has raised parallel and distributed computing to a new level. However, in order to achieve the best possible performance improvements for large-scale computing problems as well as good resource utilization, efficient resource management and scheduling is required. This paper proposes a new two-level adaptive space-sharing scheduling policy for non-dedicated heterogeneous commodity-based high-performance clusters. Using trace-driven simulation, the performance of the proposed scheduling policy is compared with existing adaptive space-sharing policies. Results of the simulation show that the proposed policy performs substantially better than the existing policies.

Abstract:

While sex and socio-economic disparities in physical activity have been well documented, not all disadvantaged women are inactive. This study aimed to examine correlates of achieving recommended levels of physical activity among women of low socio-economic position. In 2005, a population-based sample of 291 women with low educational attainment provided survey data on leisure time physical activity (LTPA). Participants reported potential personal (enjoyment and self-efficacy; barriers; intentions; guilt and priorities; routines and scheduling; occupational physical activity; television viewing), social (support from family/friends; social participation; sport/recreation club membership; dog ownership) and environmental (aesthetics; safety; local access; footpaths; interesting walks; busy roads to cross; heavy traffic) correlates of physical activity. Nearly 40% of participants achieved recommended LTPA (150 min/week). Multivariable analyses revealed that higher levels of self-efficacy for walking [prevalence ratio (PR) 2.05, 95% confidence interval (CI) 1.19–3.53], higher enjoyment of walking (PR 1.48, 95% CI 1.04–2.12), greater intentions to be active (PR 1.97, 95% CI 1.12–3.45) and having set routines for physical activity (PR 1.91, 95% CI 1.18–3.09) were significantly associated with achieving recommended LTPA. Personal factors were the characteristics most strongly associated with achieving recommended levels of LTPA among women from socio-economically disadvantaged backgrounds.

Abstract:

Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational costs. This paper presents the vision, challenges, and architectural elements for energy-efficient management of Cloud computing environments. We focus on the development of dynamic resource provisioning and allocation algorithms that consider the synergy between various data center infrastructures (i.e., hardware, power units, cooling and software) and work holistically to boost data center energy efficiency and performance. In particular, this paper proposes (a) architectural principles for energy-efficient management of Clouds; (b) energy-efficient resource allocation policies and scheduling algorithms that consider quality-of-service expectations and device power usage characteristics; and (c) a novel software technology for energy-efficient management of Clouds. We have validated our approach by conducting a set of rigorous performance evaluation studies using the CloudSim toolkit. The results demonstrate that the Cloud computing model has immense potential, as it offers significant performance gains with regard to response time and cost saving under dynamic workload scenarios.
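
As an illustration of one common energy-aware placement heuristic (not the paper's algorithms and not the CloudSim API), the sketch below (Python) places a virtual machine on the host whose estimated power draw rises least; the linear power model and all figures are assumptions.

def host_power(host, util):
    """Assumed linear power model: idle draw plus a utilisation-proportional part."""
    return host["idle_w"] + (host["max_w"] - host["idle_w"]) * util

def place_vm(hosts, vm_util):
    """Choose the host with the smallest marginal increase in power draw."""
    candidates = [h for h in hosts if h["util"] + vm_util <= 1.0]
    if not candidates:
        return None  # no host can absorb the VM without overload
    return min(candidates,
               key=lambda h: host_power(h, h["util"] + vm_util) - host_power(h, h["util"]))

hosts = [
    {"name": "h1", "util": 0.30, "idle_w": 160.0, "max_w": 250.0},
    {"name": "h2", "util": 0.60, "idle_w": 120.0, "max_w": 180.0},
]
print(place_vm(hosts, vm_util=0.20)["name"])  # h2: smaller marginal power draw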

Abstract:

This paper reports on a study into pre-service teachers’ perceptions about their professional development during practicum. The study examined to what extent, and how effectively, one group of pre-service teachers was able to integrate theory and practice during a three-week practicum in the first year of their degree. Data for this mixed methods study were drawn from one cohort of first-year students undertaking the Master of Teaching (MTeach), a graduate-level entry program in the Faculty of Education at an urban Australian university. Although there is a strong field of literature around the practicum in pre-service teacher education, there has been a limited focus on how pre-service teachers themselves perceive their development during this learning period. Further, despite widespread and longstanding acknowledgement of the “gap” between theory and practice in teacher education, there is still more to learn about how well the practicum enables an integration of these two dimensions of teacher preparation. In presenting three major findings of the study, this paper goes some way in addressing these shortcomings in the literature. First, opportunities to integrate theory and practice were varied, with many participants reporting supervision and scheduling issues as impacting on their capacity to effectively enact theory in practice. Second, participants’ privileging of theory over practice, identified previously in the literature as commonly characteristic of the pre-service teacher, was found in this study to be particularly prevalent during practicum. Third, participants overwhelmingly supported the notion of linking university coursework assessment to the practicum as a means of bridging the gap between, on the one hand, the university and the school and, on the other hand, theory and practice. The discussion and consideration of findings such as those reported in this paper are pertinent and timely, given the ratification of both the National Professional Standards for Teachers and the Initial Teacher Education Program Standards by the Australian Federal Government earlier this year. Within a number of the seven Professional Standards, graduate teachers are required to demonstrate knowledge and skills associated with both the theory and practice of teaching and with their effective integration in the classroom. To be nationally accredited, pre-service teacher education programs must provide evidence of enabling pre-service teachers to acquire such knowledge and skills.

Abstract:

One of the primary issues associated with the efficient and effective utilization of distributed computing is resource management and scheduling. As distributed computing resource failure is a common occurrence, deploying support for integrated scheduling and fault-tolerant approaches becomes of paramount importance. To this end, we propose a fault-tolerant dynamic scheduling policy that loosely couples dynamic job scheduling with a job replication scheme such that jobs are efficiently and reliably executed. The novelty of the proposed algorithm is that it uses a passive replication approach under high system load and an active replication approach under low system load. The switch between these two replication methods is also performed dynamically and transparently. A performance evaluation of the proposed fault-tolerant scheduler and a comparison with a similar fault-tolerant scheduling policy are presented, showing that the proposed policy performs better than the existing approach.
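
The fragment below (Python) sketches the load-based switch between the two replication modes described above; the threshold, the tuple-based dispatch and all names are assumptions for illustration, not the proposed policy's implementation.

ACTIVE, PASSIVE = "active", "passive"

def choose_replication_mode(system_load, high_load_threshold=0.7):
    """Assumed rule: passive replication when the system is busy,
    active replication when spare capacity is available."""
    return PASSIVE if system_load >= high_load_threshold else ACTIVE

def submit(job, system_load):
    if choose_replication_mode(system_load) == ACTIVE:
        # spare capacity: run two replicas of the job concurrently
        return [("run", job, "replica-1"), ("run", job, "replica-2")]
    # busy system: run the primary, keep a backup to activate on failure
    return [("run", job, "primary"), ("standby", job, "backup")]

print(submit("job-42", system_load=0.45))  # active replication
print(submit("job-42", system_load=0.85))  # passive replication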

Abstract:

This paper documents the initial discrete-event simulation performed to study a proposed change from a push to a pull system in an existing manufacturing company. The system is characterised by five machine lines with intermediate buffers and five major part groupings. A simulation model has been developed to mimic the flow of kanban cards in the physical system by using a series of requests that propagate back through the facility, to which the machines must respond. Customer demand therefore controls the level of activity in the plant. The results of the initial modelling steps are presented in this paper, especially the impact of kanban lot size and demand variability on the output and stability of the production system, from which a set of future work is proposed.
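
A minimal sketch of the pull mechanism described above is given below (Python): a customer withdrawal removes whole kanban lots from the final buffer and the replenishment request propagates back through the five machine lines. Lot size, buffer levels and the single-pass update are assumptions; the paper's discrete-event model tracks the kanban cards in far more detail.

LOT = 10                        # kanban lot size (assumed)
buffers = [40, 40, 40, 40, 40]  # stock held after each of the five lines

def serve_demand(demand):
    """Propagate a demand-triggered replenishment request upstream."""
    request = -(-demand // LOT) * LOT   # round withdrawal up to whole kanbans
    buffers[-1] -= request              # customer withdrawal from finished stock
    for line in reversed(range(1, len(buffers))):
        # each line rebuilds its output buffer, consuming upstream stock;
        # production at the first line (from raw material) is not modelled
        produced = min(request, buffers[line - 1])
        buffers[line] += produced
        buffers[line - 1] -= produced
        request = produced              # what the next line up must replace
    return buffers

print(serve_demand(25))  # a demand of 25 pulls three kanban lots up the line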